Cross-Modality High-Frequency Transformer for MR Image Super-Resolution

👤 Chaowei Fang, Dingwen Zhang, Lechao Cheng, Junwei Han
📅 July 2022
ACM Multimedia 2022 (conference paper)

Abstract

Improving the resolution of magnetic resonance (MR) image data is critical to computer-aided diagnosis and brain function analysis. Higher resolution helps to capture more detailed content, but typically leads to a lower signal-to-noise ratio and longer scanning times. MR image super-resolution has therefore attracted wide interest in recent years.

Existing works build extensive deep models on conventional architectures based on convolutional neural networks (CNNs). In this work, to further advance this research field, we make an early effort to build a Transformer-based MR image super-resolution framework, carefully designed to exploit valuable domain prior knowledge.

Methodology

Specifically, we consider two-fold domain priors:

1. High-frequency structure prior: Capturing detailed structural information that is critical for medical image analysis.

2. Inter-modality context prior: Leveraging information from multiple MR imaging modalities to enhance super-resolution quality.
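The first prior concerns high-frequency structure. One common way to isolate such structure, shown below as an illustrative sketch (not the authors' implementation; the function name and box-blur choice are assumptions), is to subtract a locally smoothed version of the image, leaving the edge and texture detail that super-resolution most needs to recover:

```python
import numpy as np

def high_frequency_residual(img, k=5):
    """Approximate the high-frequency component of a 2-D image as the
    residual between the image and a local-mean (box-blurred) copy.

    Illustrative sketch only; the paper's actual prior extraction may differ.
    img: 2-D float array, k: odd blur window size.
    """
    pad = k // 2
    padded = np.pad(img, pad, mode="reflect")
    blurred = np.zeros_like(img, dtype=float)
    # Naive local mean; fine for small demo images.
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            blurred[i, j] = padded[i:i + k, j:j + k].mean()
    return img - blurred
```

On a synthetic image with a vertical edge, the residual is near zero in flat regions and large near the edge, which is exactly the structural detail a high-frequency prior is meant to emphasize.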

We establish a novel Transformer architecture, called Cross-modality high-frequency Transformer (Cohf-T), to introduce such priors into super-resolving the low-resolution (LR) MR images. The framework effectively integrates cross-modality information with high-frequency features to produce high-quality super-resolved images.
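A minimal sketch of the cross-modality idea: in a Transformer, features of the modality being super-resolved (queries) can attend to features of a reference modality (keys/values), so that context from the second modality guides reconstruction. The function name, single-head form, and shapes below are assumptions for illustration, not the Cohf-T architecture itself:

```python
import numpy as np

def cross_modality_attention(q_feat, kv_feat):
    """Single-head scaled dot-product cross-attention.

    q_feat:  (N, d) tokens from the target (low-resolution) modality.
    kv_feat: (M, d) tokens from the reference modality.
    Returns (N, d) fused features: each target token becomes a softmax-weighted
    combination of reference-modality tokens.  Hypothetical sketch only.
    """
    d = q_feat.shape[1]
    scores = q_feat @ kv_feat.T / np.sqrt(d)      # (N, M) similarities
    scores -= scores.max(axis=1, keepdims=True)   # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=1, keepdims=True)       # softmax over reference tokens
    return attn @ kv_feat
```

Each output row is a convex combination of reference-modality tokens, which is how inter-modality context can be injected into the super-resolution branch.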

Experimental Results

Comprehensive experiments on two datasets indicate that Cohf-T achieves new state-of-the-art performance in MR image super-resolution tasks.

The results demonstrate the effectiveness of incorporating domain-specific priors and Transformer architecture for medical image super-resolution, opening new possibilities for clinical applications.

Keywords: Deep Learning, High-Frequency, Super-Resolution, MR Image, Medical Imaging, Transformer, Cross-Modality
